# Lightweight Optimization
## Website Mistral7b Best Vllm
A model built on the Hugging Face Transformers library and optimized with Unsloth; its specific functionality and use cases require further documentation.
Tags: Large Language Model, Transformers · Author: limitedonly41
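As a rough sketch of how an Unsloth-optimized model like this might be loaded, assuming the repository id `limitedonly41/website-mistral7b-best-vllm` (inferred from the listing, not confirmed) and Unsloth's standard `FastLanguageModel` API:

```python
# Hedged Unsloth loading sketch; the model id is inferred from the listing
# author and name, and may not match the actual repository.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="limitedonly41/website-mistral7b-best-vllm",  # assumed id
    max_seq_length=2048,
    load_in_4bit=True,  # 4-bit quantization, Unsloth's usual lightweight setting
)
FastLanguageModel.for_inference(model)  # enable Unsloth's fast inference path

inputs = tokenizer("Hello!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```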
## Nova 0.5 E1 7B
An efficiently fine-tuned model built with the TRL (Transformer Reinforcement Learning) library, focused on applying reinforcement learning to Transformer models.
Tags: Large Language Model, Transformers · Author: oscar128372
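The listing gives no details of this model's training recipe, but TRL's trainers (SFTTrainer, DPOTrainer, PPOTrainer) share a common Trainer-style API. Below is a minimal sketch of that pattern using supervised fine-tuning; the model and dataset names are placeholders taken from TRL's own documentation, not from this model:

```python
# Minimal TRL usage sketch (supervised fine-tuning); TRL's RL trainers such
# as PPOTrainer and DPOTrainer follow the same configure-then-train pattern.
# Model and dataset are placeholders from TRL's docs, not this listing's model.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("trl-lib/Capybara", split="train")

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-0.5B",  # placeholder base model
    train_dataset=dataset,
    args=SFTConfig(output_dir="sft-demo", max_steps=10),  # short demo run
)
trainer.train()
```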
## Soongsilbert Base Beep
A Korean pre-trained language model optimized for Korean natural language processing tasks.
Tags: Large Language Model, Korean · Author: jason9693
## Ja Core News Sm
spaCy's CPU-optimized Japanese processing pipeline, covering tokenization, part-of-speech tagging, dependency parsing, named entity recognition, and more.
Tags: Sequence Labeling, Japanese · Author: spacy
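A minimal usage sketch for this pipeline, using spaCy's standard API; the Portuguese pt_core_news_sm entry below is loaded the same way, just with a different package name:

```python
# Requires: pip install spacy && python -m spacy download ja_core_news_sm
import spacy

nlp = spacy.load("ja_core_news_sm")
doc = nlp("富士山は日本で一番高い山です。")

for token in doc:
    # Tokenization, POS tagging, and dependency parsing in a single pass.
    print(token.text, token.pos_, token.dep_)

for ent in doc.ents:
    # Named entities recognized by the pipeline.
    print(ent.text, ent.label_)
```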
## Pt Core News Sm
spaCy's CPU-optimized Portuguese processing pipeline, covering tokenization, part-of-speech tagging, dependency parsing, named entity recognition, and more.
Tags: Sequence Labeling, Other · Author: spacy
## Distilbert Base Es Multilingual Cased
A Spanish subset extracted from distilbert-base-multilingual-cased, the distilled version of the multilingual BERT base model; it retains the core capabilities with a smaller parameter count.
License: Apache-2.0 · Tags: Large Language Model, Transformers, Spanish · Author: Recognai
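Since the underlying distilbert-base-multilingual-cased is a masked language model, a fill-mask call is the natural smoke test. The repository id below is inferred from the listing's author and model name and may differ from the actual one:

```python
# Hedged fill-mask sketch; the model id is an assumption based on the listing.
from transformers import pipeline

unmasker = pipeline(
    "fill-mask",
    model="Recognai/distilbert-base-es-multilingual-cased",  # assumed id
)
for pred in unmasker("Madrid es la [MASK] de España."):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```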